
    Biometrics for Emotion Detection (BED): Exploring the combination of Speech and ECG

    The paradigm Biometrics for Emotion Detection (BED) is introduced, which enables unobtrusive emotion recognition while taking varying environments into account. It uses the electrocardiogram (ECG) and speech as a powerful but rarely used combination to unravel people's emotions. BED was applied in two environments (i.e., office and home-like) in which 40 people watched 6 film scenes. It is shown that both heart rate variability (derived from the ECG) and, when people's gender is taken into account, the standard deviation of the fundamental frequency of speech indicate people's experienced emotions. As such, these measures validate each other. Moreover, it is found that people's environment can indeed influence their experienced emotions. These results indicate that BED might become an important paradigm for unobtrusive emotion detection.
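
    A minimal sketch of the two measures named above, assuming R-peak times from the ECG and per-frame pitch estimates of the speech signal have already been extracted; the function names and the choice of SDNN as the HRV statistic are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def hrv_sdnn(r_peak_times_s):
    """Heart rate variability as SDNN: std. dev. of beat-to-beat (RR) intervals, in ms."""
    rr_ms = np.diff(r_peak_times_s) * 1000.0  # successive RR intervals
    return np.std(rr_ms, ddof=1)

def f0_std(f0_frames_hz):
    """Standard deviation of the fundamental frequency over voiced frames."""
    f0 = np.asarray(f0_frames_hz, dtype=float)
    voiced = f0[f0 > 0]  # unvoiced frames are conventionally coded as 0 Hz
    return np.std(voiced, ddof=1)
```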

    Biosignals as an Advanced Man-Machine Interface

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This paper explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 24 people. A range of techniques was tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for both personalized biosignal profiles and the recording of multiple biosignals in parallel.
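
    For illustration, a sketch of such a generic (person-independent) classification pipeline using scikit-learn; the feature matrix, classifier choice, and cross-validation setup are assumptions, since the abstract does not name the techniques that were tested.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data: one row per recording, columns = summary statistics of
# EDA and the three facial EMG signals; labels 0..3 stand for the four
# emotion classes (neutral, positive, negative, mixed).
rng = np.random.default_rng(42)
X = rng.normal(size=(24, 16))
y = np.repeat(np.arange(4), 6)

# One generic model for all participants, i.e., no personal profiles.
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=3))
scores = cross_val_score(clf, X, y, cv=4)
print(f"mean accuracy: {scores.mean():.2%}")
```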

    Perceiving emotions through psychophysiological signals

    Emotions heavily influence our cognitive functioning. Therefore, it is interesting to develop measurement techniques that can record experienced emotions. Moreover, to improve user-system interaction, computers need to recognize and respond properly to their user's emotional state. This would enable affective computing, which relates to, arises from, or deliberately influences emotion. A series of experiments is discussed in which a range of psychophysiological measures is applied to penetrate human emotion space. Here, we distinguish three facets: the obtrusiveness of the measures, their noise sensitivity, and the ecological validity of the research. Several statistical parameters were derived from physiological measurements of three electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). In one experiment, 24 participants were asked to watch film scenes of 120 seconds, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. Using EMG2 and EMG3, discrimination between the four emotion categories was possible. In two other experiments, 26 participants were asked to read out a story, and to relive a recent anxious experience and speak about it. The latter enabled us to determine the amount of experienced arousal. In addition to the three experiments, experiences with galvanic skin conductance and heart rate variability are discussed. In all instances, real-time processing of the signals proved to be possible. This enables tailored user-system interaction, facilitated by an emotional awareness of systems. Such systems could, for example, be applied to increase the immersion of participants in games, in ambient intelligence settings incorporating Personalized Empathic Computing (PEC), or in telepsychiatry settings. Such systems would introduce a new era in user-system interaction.
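
    The real-time processing mentioned above can be pictured with a short sliding-window sketch; the sampling rate, window length, and summary statistics below are assumed values, not those of the experiments.

```python
from collections import deque
import numpy as np

SAMPLE_RATE_HZ = 1000  # assumed EMG sampling rate
WINDOW_S = 2.0         # assumed analysis-window length

window = deque(maxlen=int(SAMPLE_RATE_HZ * WINDOW_S))

def on_sample(emg_value):
    """Handle one incoming EMG sample; emit statistics once the window is full."""
    window.append(emg_value)
    if len(window) < window.maxlen:
        return None
    w = np.asarray(window)
    return {"mean": w.mean(), "std": w.std(ddof=1)}
```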

    Unobtrusive Sensing of Emotions (USE)

    Emotions are acknowledged as a crucial element for artificial intelligence; as is illustrated, this is no different for Ambient Intelligence (AmI). Unobtrusive Sensing of Emotions (USE) is introduced to enrich AmI with empathic abilities. USE combines speech and the electrocardiogram (ECG) as a powerful and unique pairing to unravel people's emotions. In a controlled study, 40 people watched film scenes, in either an office or a home-like setting. It is shown that, when people's gender is taken into account, both heart rate variability (derived from the ECG) and the standard deviation of the fundamental frequency of speech indicate people's experienced valence and arousal, in parallel. As such, both measures validate each other. Thus, through USE, reliable cues can be derived that indicate people's emotional state, in particular when their environment is also taken into account. Since all this is crucial for both AI and true AmI, this study provides a first significant leap forward in making AmI a success.
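
    How gender can be taken into account when relating both cues to rated valence and arousal might look as follows: standardize each cue within gender groups before correlating it with the ratings. The column names and values below are placeholders, not data from the study.

```python
import pandas as pd

df = pd.DataFrame({
    "gender":  ["f", "f", "f", "m", "m", "m"],
    "hrv":     [52.1, 61.3, 48.0, 44.7, 49.2, 41.5],  # placeholder SDNN (ms)
    "f0_sd":   [28.4, 35.0, 31.2, 18.9, 22.3, 20.1],  # placeholder F0 std (Hz)
    "arousal": [3.2, 4.1, 3.6, 2.8, 3.5, 2.9],        # placeholder self-ratings
})

# Standardize each cue within gender so both measures can be read in parallel.
for cue in ("hrv", "f0_sd"):
    df[cue + "_z"] = df.groupby("gender")[cue].transform(
        lambda s: (s - s.mean()) / s.std(ddof=1))

print(df[["hrv_z", "f0_sd_z"]].corrwith(df["arousal"]))
```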

    Affective Man-Machine Interface: Unveiling human emotions through biosignals

    As has been known for centuries, humans exhibit an electrical profile. This profile is altered through various psychological and physiological processes, which can be measured through biosignals, e.g., electromyography (EMG) and electrodermal activity (EDA). These biosignals can reveal our emotions and, as such, can serve as an advanced man-machine interface (MMI) for empathic consumer products. However, such an MMI requires the correct classification of biosignals into emotion classes. This chapter starts with an introduction to biosignals for emotion detection. Next, a state-of-the-art review of automatic emotion classification is presented. Moreover, guidelines for affective MMI are presented. Subsequently, a study is presented that explores the use of EDA and three facial EMG signals to determine neutral, positive, negative, and mixed emotions, using recordings of 21 people. A range of techniques was tested, resulting in a generic framework for automated emotion classification with up to 61.31% correct classification of the four emotion classes, without the need for personal profiles. Among various other directives for future research, the results emphasize the need for parallel processing of multiple biosignals.
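
    A sketch of the parallel processing the results call for: extract the same summary statistics from each biosignal recorded in parallel and fuse them at the feature level. The statistic set mirrors the six parameters used elsewhere in this list; the function names are illustrative.

```python
import numpy as np
from scipy import stats

def channel_stats(x):
    """Six summary statistics for one biosignal channel."""
    x = np.asarray(x, dtype=float)
    return np.array([x.mean(),
                     np.mean(np.abs(x - x.mean())),  # mean absolute deviation
                     x.std(ddof=1),
                     x.var(ddof=1),
                     stats.skew(x),
                     stats.kurtosis(x)])

def fuse(eda, emg1, emg2, emg3):
    """Feature-level fusion of biosignals recorded in parallel."""
    return np.concatenate([channel_stats(c) for c in (eda, emg1, emg2, emg3)])
```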

    Communication and Persuasion Technology: Psychophysiology of Emotions and User-Profiling

    A theoretical framework for communication and persuasion technology is introduced, utilizing people's emotions and personality characteristics. It uses two unobtrusive psychophysiological measures to penetrate people's emotional space: heart rate variability and the variability of the fundamental frequency of the voice. In addition, two experiments are described that validate these measures. Future systems can utilize such technology to sense people's emotions and adopt suitable persuasion strategies.
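
    As a purely hypothetical illustration of the framework's final step, the sketch below maps the two measures to a persuasion strategy; the thresholds and strategy labels are invented for illustration and do not come from the experiments.

```python
def choose_strategy(hrv_sdnn_ms: float, f0_sd_hz: float) -> str:
    """Pick a persuasion strategy from the two psychophysiological cues."""
    aroused = f0_sd_hz > 30.0      # hypothetical high-arousal threshold
    stressed = hrv_sdnn_ms < 40.0  # hypothetical low-HRV (stress) threshold
    if stressed and aroused:
        return "calming, low-pressure message"
    if aroused:
        return "energetic call to action"
    return "neutral, informative message"
```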

    Computing emotion awareness through facial electromyography

    To improve human-computer interaction (HCI), computers need to recognize and respond properly to their user's emotional state. This is a fundamental application of affective computing, which relates to, arises from, or deliberately influences emotion. As a first step toward a system that recognizes the emotions of individual users, this research focuses on how emotional experiences are expressed in six parameters (i.e., mean, absolute deviation, standard deviation, variance, skewness, and kurtosis) of physiological measurements of three electromyography signals: frontalis (EMG1), corrugator supercilii (EMG2), and zygomaticus major (EMG3). The 24 participants were asked to watch film scenes of 120 seconds, which they rated afterward. These ratings enabled us to distinguish four categories of emotions: negative, positive, mixed, and neutral. The skewness of EMG2 and four parameters of EMG3 discriminate between the four emotion categories, despite the coarse time windows that were used. Moreover, rapid processing of the signals proved to be possible. This enables tailored HCI, facilitated by an emotional awareness of systems.
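
    One way such discrimination can be tested is a one-way ANOVA on a single parameter across the four rated categories, sketched below for the skewness of EMG2; the signal data are synthetic placeholders and the choice of test is an assumption.

```python
import numpy as np
from scipy.stats import f_oneway, skew

rng = np.random.default_rng(0)
# Placeholder: skewness of the EMG2 signal per 120-second scene, grouped by
# the post-hoc emotion rating of that scene (6 scenes per category).
groups = {cat: skew(rng.normal(size=(6, 500)), axis=1)
          for cat in ("negative", "positive", "mixed", "neutral")}

f_stat, p_value = f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```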